

Search for: All records

Creators/Authors contains: "Tunnell, Christopher"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo period (an administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. We perform the first search for ultralight dark matter using a magnetically levitated particle. A submillimeter permanent magnet is levitated in a superconducting trap with a measured force sensitivity of 0.2 fN/√Hz. We find no evidence of a signal and derive limits on dark matter coupled to the difference between baryon and lepton number, B−L, in the mass range (1.10360–1.10485) × 10^-13 eV/c^2. Our most stringent limit on the coupling strength is g_B−L ≲ 2.98 × 10^-21. We propose the POLONAISE (Probing Oscillations using Levitated Objects for Novel Accelerometry In Searches of Exotic physics) experiment, which features short-, medium-, and long-term upgrades that will give us leading sensitivity in a wide mass range, demonstrating the promise of this novel quantum sensing technology in the hunt for dark matter. Published by the American Physical Society, 2025.
    Free, publicly-accessible full text available June 1, 2026
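    As a quick consistency check on the abstract above (a sketch, not taken from the paper itself): an ultralight dark-matter field of mass m oscillates at its Compton frequency f = mc^2/h, so the quoted mass window maps to a narrow frequency band near the trap's mechanical response.

```python
import math

# Exact SI-2019 defined constants.
H_PLANCK = 6.62607015e-34   # Planck constant, J s
E_CHARGE = 1.602176634e-19  # elementary charge, i.e. J per eV

def compton_frequency_hz(mass_ev: float) -> float:
    """Oscillation frequency in Hz for a field of mass `mass_ev` (in eV/c^2)."""
    return mass_ev * E_CHARGE / H_PLANCK

# Mass window from the abstract: (1.10360-1.10485) x 10^-13 eV/c^2.
lo = compton_frequency_hz(1.10360e-13)
hi = compton_frequency_hz(1.10485e-13)
print(f"search window: {lo:.3f} Hz to {hi:.3f} Hz")  # roughly 26.7 Hz
```

    The narrowness of the band (a fraction of a percent) reflects why a resonant mechanical sensor is well matched to this search.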
  2. Abstract Multi-dimensional parameter spaces are commonly encountered in physics theories that go beyond the Standard Model. However, they often possess complicated posterior geometries that are expensive to traverse using techniques traditional to astroparticle physics. Several recent innovations, which are only beginning to make their way into this field, have made navigating such complex posteriors possible. These include GPU acceleration, automatic differentiation, and neural-network-guided reparameterization. We apply these advancements to dark matter direct detection experiments in the context of non-standard neutrino interactions and benchmark their performance against traditional nested sampling techniques when conducting Bayesian inference. Compared to nested sampling alone, we find that these techniques increase performance for both nested sampling and Hamiltonian Monte Carlo, accelerating inference by factors of ~100 and ~60, respectively. As nested sampling also evaluates the Bayesian evidence, these advancements can be exploited to improve model comparison performance while retaining compatibility with existing implementations that are widely used in the natural sciences. Using these techniques, we perform the first scan in the neutrino non-standard interactions parameter space for direct detection experiments whereby all parameters are allowed to vary simultaneously. We expect that these advancements are broadly applicable to other areas of astroparticle physics featuring multi-dimensional parameter spaces.
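    The Hamiltonian Monte Carlo machinery the abstract above benchmarks is built on gradient-driven leapfrog integration; the toy below (a minimal sketch with a standard-normal target, not the experiments' actual likelihood) shows the two properties a sampler relies on: near-conservation of energy and exact reversibility under a momentum flip.

```python
import math

def leapfrog(q, p, grad_u, step, n_steps):
    """Integrate Hamilton's equations with the leapfrog scheme."""
    p -= 0.5 * step * grad_u(q)      # initial half step in momentum
    for _ in range(n_steps - 1):
        q += step * p                # full step in position
        p -= step * grad_u(q)        # full step in momentum
    q += step * p
    p -= 0.5 * step * grad_u(q)      # final half step in momentum
    return q, p

grad_u = lambda q: q                           # gradient of U(q) = q^2 / 2
h = lambda q, p: 0.5 * q * q + 0.5 * p * p     # Hamiltonian (total energy)

q0, p0 = 1.0, 0.5
q1, p1 = leapfrog(q0, p0, grad_u, step=0.01, n_steps=100)
print(f"energy drift: {abs(h(q1, p1) - h(q0, p0)):.2e}")  # small, O(step^2)
```

    Because the integrator follows the gradient of the log-posterior, each proposal moves far across the parameter space while staying in regions of high probability, which is the source of the speedups quoted in the abstract.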
  3. Traditionally, inference in liquid xenon direct detection dark matter experiments has used estimators of event energy or density estimation of simulated data. Such methods have drawbacks compared to the computation of explicit likelihoods, such as an inability to conduct statistical inference in high-dimensional parameter spaces, or a failure to make use of all available information. In this work, we implement a continuous approximation of an event simulator model within a probabilistic programming framework, allowing for the application of high-performance gradient-based inference methods such as the No-U-Turn Sampler. We demonstrate an improvement in inference results, with percent-level decreases in measurement uncertainties. Finally, in the case where some observables can be measured using multiple independent channels, such a method also enables the seamless incorporation of this additional information, making full use of everything available.
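    The key idea in the abstract above is the continuous approximation of a discrete simulator. A hypothetical illustration (the binomial detection stage here is an invented stand-in, not the paper's model): replacing a discrete Binomial(n, p) photon-detection draw with its Gaussian approximation yields a likelihood that is smooth in the parameter p, so gradients exist everywhere and samplers like NUTS can use them.

```python
import math

def log_likelihood(p, n=100, k=30):
    """Gaussian relaxation of Binomial(n, p) evaluated at observed count k."""
    mean = n * p
    var = n * p * (1.0 - p)
    return -0.5 * (k - mean) ** 2 / var - 0.5 * math.log(2 * math.pi * var)

def grad(p, eps=1e-6):
    """Central finite-difference gradient; well-defined because the relaxation is smooth."""
    return (log_likelihood(p + eps) - log_likelihood(p - eps)) / (2 * eps)

# The gradient points toward the maximum near k/n = 0.3 from either side.
print(grad(0.2), grad(0.4))
```

    In the exact discrete model, the likelihood is a sum over integer counts and carries no useful gradient information; the relaxation is what unlocks the gradient-based methods the abstract describes.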
  4. Abstract (Ultra)light spin-1 particles — dark photons — can constitute all of dark matter (DM) and have beyond Standard Model couplings. This can lead to a coherent, oscillatory signature in terrestrial detectors that depends on the coupling strength. We provide a signal analysis and statistical framework for inferring the properties of such DM by taking into account (i) the stochastic and (ii) the vector nature of the underlying field, along with (iii) the effects due to the Earth's rotation. Owing to equipartition, on time scales shorter than the coherence time the DM field vector typically traces out a fixed ellipse. Taking this ellipse and the rotation of the Earth into account, we highlight a distinctive three-peak signal in Fourier space that can be used to constrain DM coupling strengths. Accounting for all three peaks, we derive latitude-independent constraints on such DM couplings, unlike those stemming from single-peak studies. We apply our framework to the search for ultralight B−L DM using optomechanical sensors, demonstrating the ability to delve into previously unprobed regions of this DM candidate's parameter space.
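    The three-peak structure described in the abstract above can be illustrated with a toy model (all frequencies and amplitudes below are arbitrary illustration values, not the paper's): a carrier at the DM Compton frequency, amplitude-modulated by the Earth's rotation, splits into the carrier plus two sidebands at f_dm ± f_rot.

```python
import math

N = 512
F_DM, F_ROT = 64, 4  # cycles per record; illustrative integers to avoid leakage

# Carrier amplitude-modulated by a slow rotation term.
signal = [
    (1.0 + 0.6 * math.cos(2 * math.pi * F_ROT * t / N))
    * math.cos(2 * math.pi * F_DM * t / N)
    for t in range(N)
]

def dft_power(x, k):
    """Naive single-bin DFT power |X[k]|^2 (O(N) per bin; fine for a sketch)."""
    re = sum(v * math.cos(2 * math.pi * k * t / len(x)) for t, v in enumerate(x))
    im = -sum(v * math.sin(2 * math.pi * k * t / len(x)) for t, v in enumerate(x))
    return re * re + im * im

power = [dft_power(signal, k) for k in range(N // 2)]
peaks = sorted(range(N // 2), key=power.__getitem__, reverse=True)[:3]
print(sorted(peaks))  # → [60, 64, 68]: carrier at 64, sidebands at 64 ± 4
```

    Because the sideband spacing is set by the rotation frequency rather than the detector's location, fitting all three peaks is what makes the constraints latitude-independent.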
  5. De_Vita, R; Espinal, X; Laycock, P; Shadura, O (Ed.)
    The Big Science projects common to multi-institute particle-physics collaborations generate unique needs for member management, including paper authorship tracking, shift assignments, subscription to mailing lists, and access to third-party applications such as GitHub and Slack. For smaller collaborations of under 200 people, no facility for centralized member management is typically available, and these needs are usually handled manually by long-term members, a practice that becomes untenable as collaborations grow. To automate many of these tasks for the expanding XENON collaboration, we developed the XENONnT User Management Website, a web application that stores and updates data related to collaboration members using Node.js and MongoDB. We found that modern web frameworks are mature and approachable enough that a student can develop a system meeting the unique needs of the collaboration. The application allows shifts to be scheduled and coordinated between institutes. Control of third-party applications is implemented using REST API integration. The XENONnT User Management Website is open source and showcases how quickly a utility application can be built with a web framework, demonstrating the value of web-based approaches for solving specific problems in the logistics of running Big Science collaborations.
  6. De_Vita, R; Espinal, X; Laycock, P; Shadura, O (Ed.)
    Effective metadata management is a consistent challenge faced by many scientific experiments. These challenges are magnified by the evolving needs of the experiment, the intricacies of seamlessly integrating a new system with existing analytical frameworks, and the crucial mandate to maintain database integrity. In this work we present the various challenges faced by experiments that produce a large amount of metadata and describe the solution used by the XENON experiment for metadata management. 
  7. De_Vita, R; Espinal, X; Laycock, P; Shadura, O (Ed.)
    In dual-phase time-projection chambers, photosensor arrays are arranged to allow inference of the positions of interactions within the detector. If a broken or saturated photosensor leaves a gap in the data, the inferred position is less precise and less accurate. Since photosensors cannot be repaired or replaced once the experiment has begun, we develop methods to estimate the missing signals. Our group is developing a probabilistic graphical model of the correlations between the numbers of photons detected by adjacent photosensors, representing the distribution over detected photons as a Poisson distribution. Determining the posterior distribution over the number of photons detected by a sensor then requires integration over a multivariate Poisson distribution, which is computationally intractable in high dimensions. In this work, we present an approach to quickly calculate and integrate over a multidimensional Poisson distribution. Our approach uses Zarr, a Python array-compression package, to manage large multi-dimensional arrays, and approximates the log factorial to quickly calculate the Poisson distribution without overflow.
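    The overflow-safe evaluation mentioned at the end of the abstract above can be sketched as follows (a minimal illustration, assuming the standard log-gamma identity log(n!) = lgamma(n + 1); the paper's actual implementation may differ): work entirely in log space so that large counts and rates never overflow a float.

```python
import math

def log_poisson_pmf(n: int, rate: float) -> float:
    """log P(N = n) for N ~ Poisson(rate): n*log(rate) - rate - log(n!).

    math.lgamma(n + 1) gives log(n!) without ever forming the huge factorial,
    so this stays finite where math.factorial-based code would overflow.
    """
    return n * math.log(rate) - rate - math.lgamma(n + 1)

# Sanity check against the direct formula for small n: 2^3 e^-2 / 3!
print(math.exp(log_poisson_pmf(3, 2.0)))
# Still well-defined for counts where 1000! would be astronomically large.
print(log_poisson_pmf(1000, 1000.0))
```

    Integrals over many such dimensions then become sums of log-space terms, which is what makes the multivariate computation tractable.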
  8. De_Vita, R; Espinal, X; Laycock, P; Shadura, O (Ed.)
    This paper presents a proof-of-concept semi-supervised autoencoder for the energy reconstruction of scattering particle interactions inside dual-phase time projection chambers (TPCs), such as XENONnT. This autoencoder model is trained on simulated XENONnT data and is able to simultaneously reconstruct photosensor array hit patterns and infer the number of electrons in the gas gap, which is proportional to the energy of ionization signals in the TPC. Development plans for this autoencoder model are discussed, including future work in developing a faster simulation technique for dual-phase TPCs.
  9. This work proposes a domain-informed neural network architecture for experimental particle physics, using particle interaction localization with the time-projection chamber (TPC) technology for dark matter research as an example application. A key feature of the signals generated within the TPC is that they allow localization of particle interactions through a process called reconstruction (i.e., inverse-problem regression). While multilayer perceptrons (MLPs) have emerged as a leading contender for reconstruction in TPCs, such a black-box approach does not reflect prior knowledge of the underlying scientific processes. This paper looks anew at neural network-based interaction localization and encodes prior detector knowledge, in terms of both signal characteristics and detector geometry, into the feature encoding and the output layers of a multilayer (deep) neural network. The resulting neural network, termed the Domain-informed Neural Network (DiNN), limits the receptive fields of the neurons in the initial feature encoding layers in order to account for the spatially localized nature of the signals produced within the TPC. This aspect of the DiNN, which has similarities with the emerging area of graph neural networks in that the neurons in the initial layers connect to only a handful of neurons in the succeeding layer, significantly reduces the number of parameters in the network compared to an MLP. In addition, to account for the detector geometry, the output layers of the network are modified using two geometric transformations to ensure that the DiNN produces localizations within the interior of the detector. The end result is a neural network architecture that has 60% fewer parameters than an MLP yet achieves similar localization performance, and that provides a path to future architectural developments with improved performance through its ability to encode additional domain knowledge into the architecture.
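    The geometric output transformation described in the abstract above can be sketched as follows (a hedged illustration: the detector radius and the tanh squashing are assumptions chosen for clarity, not necessarily the paper's exact construction): any unconstrained 2D network output is mapped into the interior of a disk, so the predicted interaction position can never fall outside the detector.

```python
import math

R_TPC = 66.4  # cm; hypothetical TPC radius, used only for illustration

def constrain_to_disk(u, v, radius=R_TPC):
    """Map any point (u, v) in R^2 into the disk of the given radius.

    tanh maps the raw radial distance (0, inf) into (0, 1), so the scaled
    output radius never exceeds `radius` regardless of the network output.
    """
    r = math.hypot(u, v)
    if r == 0.0:
        return 0.0, 0.0
    scale = radius * math.tanh(r) / r
    return u * scale, v * scale

for u, v in [(0.1, 0.2), (2.0, -1.0), (500.0, -300.0)]:
    x, y = constrain_to_disk(u, v)
    print(f"({u}, {v}) -> radius {math.hypot(x, y):.2f} cm")
```

    Building the constraint into the output layer, rather than clipping predictions afterwards, keeps the mapping differentiable so it can be trained end to end.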
  10.
    Neutrino experiments study the least understood of the Standard Model particles by observing their direct interactions with matter or searching for ultra-rare signals. The study of neutrinos typically requires overcoming large backgrounds, elusive signals, and small statistics. The introduction of state-of-the-art machine learning tools to solve analysis tasks has had a major impact on these challenges across neutrino experiments. Machine learning algorithms have become an integral tool of neutrino physics, and their development is of great importance to the capabilities of next-generation experiments. An understanding of the roadblocks, both human and computational, and of the challenges that remain in applying these techniques is critical to their proper and beneficial use in physics applications. This review presents the current status of machine learning applications for neutrino physics in terms of the challenges and opportunities at the intersection of these two fields.